SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
Authors
Philip E. Gill, Walter Murray, Michael A. Saunders
Abstract
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. Second derivatives are assumed to be unavailable or too expensive to calculate. We discuss an SQP algorithm that uses a smooth augmented Lagrangian merit function and makes explicit provision for infeasibility in the original problem and the QP subproblems. The Hessian of the Lagrangian is approximated using a limited-memory quasi-Newton method. SNOPT is a particular implementation that uses a reduced-Hessian semidefinite QP solver (SQOPT) for the QP subproblems. It is designed for problems with many thousands of constraints and variables but is best suited for problems with a moderate number of degrees of freedom (say, up to 2000). Numerical results are given for most of the CUTEr and COPS test collections (about 1020 examples of all sizes up to 40000 constraints and variables, and up to 20000 degrees of freedom).
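For orientation, here is a minimal sketch of the two ingredients named above, written in generic SQP notation rather than quoted from the paper (the symbols x_k, g_k, H_k, the constraint functions c, their Jacobian J, slacks s, multipliers π, and the diagonal penalty matrix D_ρ are all notation assumed here): each major iteration solves a QP subproblem built from a quasi-Newton approximation H_k of the Hessian of the Lagrangian, and progress toward a solution is measured with a smooth augmented Lagrangian merit function.

\[
\min_{p}\;\; g_k^{T} p + \tfrac{1}{2}\, p^{T} H_k\, p
\quad\text{subject to}\quad
c(x_k) + J(x_k)\, p \;\ge\; 0,
\]
\[
\mathcal{M}_{\rho}(x, s, \pi) \;=\; f(x) \;-\; \pi^{T}\bigl(c(x) - s\bigr) \;+\; \tfrac{1}{2}\,\bigl(c(x) - s\bigr)^{T} D_{\rho}\,\bigl(c(x) - s\bigr).
\]

Here D_ρ is a diagonal matrix of penalty parameters, and bounds and linear constraints are carried into the subproblem unchanged.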
Similar papers
SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization
Sequential quadratic programming (SQP) methods have proved highly effective for solving constrained optimization problems with smooth nonlinear functions in the objective and constraints. Here we consider problems with general inequality constraints (linear and nonlinear). We assume that first derivatives are available and that the constraint gradients are sparse. We discuss an SQP algorithm th...
Users Guide for SnadiOpt: A Package Adding Automatic Differentiation to Snopt
SnadiOpt is a package that supports the use of the automatic differentiation package ADIFOR with the optimization package Snopt. Snopt is a general-purpose system for solving optimization problems with many variables and constraints. It minimizes a linear or nonlinear function subject to bounds on the variables and sparse linear or nonlinear constraints. It is suitable for large-scale linear an...
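As a hedged sketch of the problem class such a solver targets (the symbols f, c, A, l, u below are generic and not taken from the Snopt documentation), the nonlinear objective together with the sparse linear and nonlinear constraints can be collected into a single bounded system:

\[
\min_{x \in \mathbb{R}^{n}} \; f(x)
\qquad\text{subject to}\qquad
l \;\le\;
\begin{pmatrix} x \\ c(x) \\ A x \end{pmatrix}
\;\le\; u,
\]

with automatic differentiation supplying the gradient of f and the sparse Jacobian of c when hand-coded derivative routines are not available.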
A Barrier Algorithm for Large Nonlinear Optimization Problems
The problem of large-scale constrained optimization is addressed. A barrier function is used to transform the problem into a sequence of subproblems with nonlinear equality constraints. Barrier methods differ primarily in how such subproblems are solved. The method used here for each subproblem is similar to what the second-derivative method of Murray and Prieto (MP) reduces to when applied to ...
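As an illustration of the transformation being described (a generic logarithmic barrier, under the assumption that the inequalities are written c(x) ≥ 0 with slacks s; this is a sketch, not necessarily the exact subproblem of Murray and Prieto), each barrier subproblem carries nonlinear equality constraints only:

\[
\min_{x,\; s > 0} \;\; f(x) \;-\; \mu \sum_{i} \ln s_{i}
\qquad\text{subject to}\qquad
c(x) - s = 0,
\]

with the barrier parameter μ driven toward zero over the sequence of subproblems.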
CONSTRAINED BIG BANG-BIG CRUNCH ALGORITHM FOR OPTIMAL SOLUTION OF LARGE SCALE RESERVOIR OPERATION PROBLEM
A constrained version of the Big Bang-Big Crunch algorithm for the efficient solution of optimal reservoir operation problems is proposed in this paper. The Big Bang-Big Crunch (BB-BC) algorithm is a new meta-heuristic population-based algorithm that relies on one of the theories of the evolution of the universe, namely the Big Bang and Big Crunch theory. An improved formulation of the algorithm na...
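For context, a minimal sketch of the two phases that give the method its name, in the commonly cited form of Erol and Eksin (the symbols below are assumptions for illustration; the paper's constrained, reservoir-specific variant is not reproduced here): the Big Crunch contracts the current population to a fitness-weighted center of mass, and the next Big Bang scatters new candidates around it with a radius that shrinks as iterations proceed.

\[
x^{c} \;=\; \frac{\sum_{i=1}^{N} x^{(i)} / f_{i}}{\sum_{i=1}^{N} 1 / f_{i}},
\qquad
x^{(i)}_{\text{new}} \;=\; x^{c} \;+\; \frac{r\,\alpha\,\bigl(x_{\max} - x_{\min}\bigr)}{k},
\]

where f_i is the objective value of candidate x^(i), r is a standard normal random number, α is a scale parameter, k is the iteration counter, and x_max, x_min are the variable bounds.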
Sequential Quadratic Programming
Introduction. Since its popularization in the late 1970s, sequential quadratic programming (SQP) has arguably become the most successful method for solving nonlinearly constrained optimization problems. As with most optimization methods, SQP is not a single algorithm but rather a conceptual method from which numerous specific algorithms have evolved. Backed by a solid theoretical and computational foundat...